471 research outputs found

    Images of connected sets by semicontinuous multifunctions

    Get PDF
    The image of a connected set under an upper-semicontinuous (or lower-semicontinuous) multifunction whose values are nonempty and connected is connected. We prove this theorem in its most general setting and show its usefulness in various examples from optimization and nonlinear analysis.
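
    For concreteness, here is a symbolic form of the statement (our paraphrase, not the authors' exact formulation):

    ```latex
    % F : X \rightrightarrows Y upper- (or lower-) semicontinuous,
    % with F(x) nonempty and connected for every x; A \subseteq X connected.
    \[
      F(A) \;=\; \bigcup_{x \in A} F(x) \ \text{ is connected.}
    \]
    ```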

    Avant-propos (Foreword)

    Get PDF

    Permanently going back and forth between the “quadratic world” and the “convexity world” in optimization

    Get PDF
    The objective of this work is twofold. First, we review different results concerning the convexity of images under quadratic mappings, putting them in chronological perspective. Second, we shed new light on these results from a geometrical point of view, in order to provide new, comprehensive proofs and to place them in a more general and abstract context.
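
    A classical example of the kind of result surveyed here is Dines' theorem on pairs of homogeneous quadratic forms, recalled for illustration (not a contribution of this paper):

    ```latex
    % Dines (1941): for homogeneous quadratic forms q_1, q_2 on R^n,
    \[
      q_i(x) = x^{\top} A_i x,\ i = 1,2
      \quad\Longrightarrow\quad
      \{\, (q_1(x), q_2(x)) : x \in \mathbb{R}^n \,\} \ \text{is convex in } \mathbb{R}^2 .
    \]
    ```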

    Statistically optimal analysis of state-discretized trajectory data from multiple thermodynamic states

    Get PDF
    We propose a discrete transition-based reweighting analysis method (dTRAM) for analyzing configuration-space-discretized simulation trajectories produced at different thermodynamic states (temperatures, Hamiltonians, etc.). dTRAM provides maximum-likelihood estimates of stationary quantities (probabilities, free energies, expectation values) at any thermodynamic state. In contrast to the weighted histogram analysis method (WHAM), dTRAM does not require data to be sampled from global equilibrium, and can thus produce superior estimates for enhanced-sampling data such as parallel/simulated tempering, replica exchange, umbrella sampling, or metadynamics. In addition, dTRAM provides optimal estimates of Markov state models (MSMs) from the discretized state-space trajectories at all thermodynamic states. Under suitable conditions, these MSMs can be used to calculate kinetic quantities (e.g., rates, timescales). In the limit of a single thermodynamic state, dTRAM estimates a maximum-likelihood reversible MSM, while in the limit of uncorrelated sampling data, dTRAM is identical to WHAM. dTRAM is thus a generalization of both estimators.
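
    As a point of reference for the WHAM limit mentioned above, here is a minimal sketch of the classical WHAM self-consistent iteration over discretized bins; the array names and shapes are our own assumptions, and this is not the dTRAM estimator itself.

    ```python
    import numpy as np

    def wham(counts, bias_factors, n_iter=10000, tol=1e-10):
        """Classical WHAM self-consistent iteration (illustrative sketch only).

        counts[k, i]       -- samples from thermodynamic state k observed in bin i
        bias_factors[k, i] -- exp(-u_k(x_i)), reduced bias of state k at bin i
        Returns (p, f): unbiased bin probabilities and per-state normalizations.
        """
        K, n = counts.shape
        N_k = counts.sum(axis=1)          # total samples per thermodynamic state
        N_i = counts.sum(axis=0)          # total samples per bin
        f = np.ones(K)                    # normalization constants, one per state
        p = np.full(n, 1.0 / n)           # unbiased bin probabilities

        for _ in range(n_iter):
            # p_i = sum_k counts[k, i] / sum_k N_k * f_k * bias_factors[k, i]
            p_new = N_i / (bias_factors.T @ (N_k * f))
            p_new /= p_new.sum()
            # f_k = 1 / sum_i bias_factors[k, i] * p_i
            f = 1.0 / (bias_factors @ p_new)
            if np.max(np.abs(p_new - p)) < tol:
                p = p_new
                break
            p = p_new
        return p, f
    ```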

    A Fresh Variational-Analysis Look at the Positive Semidefinite Matrices World

    Get PDF
    Engineering sciences and applications of mathematics show unambiguously that positive semidefiniteness of matrices is the most important generalization of the non-negative real numbers. This notion of non-negativity for matrices has been well studied in the literature; it has been the subject of review papers and entire chapters of books. This paper reviews some of the nice, useful properties of positive (semi)definite matrices, and focuses in particular on (i) characterizations of positive (semi)definiteness and (ii) the geometrical properties of the set of positive semidefinite matrices. Some properties that turn out to be less well known receive special treatment here. The use of these properties in optimization, as well as various references to applications, are spread throughout the paper. The "raison d'ĂȘtre" of this paper is essentially pedagogical; it adopts the viewpoint of variational analysis, shedding new light on the topic. Important, fruitful, and subtle, the positive semidefinite world is a good place to start with this domain of applied mathematics.
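
    For example, three of the standard equivalent characterizations of positive semidefiniteness alluded to in (i) read as follows (textbook facts, recalled for convenience):

    ```latex
    \[
      A \in \mathcal{S}^n \ \text{positive semidefinite}
      \;\iff\; x^{\top} A x \ge 0 \ \ \forall x \in \mathbb{R}^n
      \;\iff\; \lambda_{\min}(A) \ge 0
      \;\iff\; A = B^{\top} B \ \text{for some } B \in \mathbb{R}^{n \times n}.
    \]
    ```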

    When some variational properties force convexity

    Get PDF
    The notion of an adequate (resp. strongly adequate) function has recently been introduced to characterize the essentially strictly convex (resp. essentially firmly subdifferentiable) functions among the weakly lower semicontinuous (resp. lower semicontinuous) ones. In this paper we provide various necessary and sufficient conditions for the lower semicontinuous hull of an extended real-valued function on a reflexive Banach space to be essentially strictly convex. Some new results on nearest (farthest) points are derived from this approach. Keywords: convex duality, well-posed optimization problem, essential strict convexity, essential smoothness, best approximation.

    On representations of the feasible set in convex optimization

    Full text link
    We consider the convex optimization problem $\min \{ f(x) : g_j(x) \leq 0,\ j = 1,\dots,m \}$, where $f$ is convex, the feasible set $K$ is convex, and Slater's condition holds, but the functions $g_j$ are not necessarily convex. We show that for any representation of $K$ that satisfies a mild nondegeneracy assumption, every minimizer is a Karush-Kuhn-Tucker (KKT) point and, conversely, every KKT point is a minimizer. That is, the KKT optimality conditions are necessary and sufficient, as in convex programming where one assumes that the $g_j$ are convex. So in convex optimization, as far as one is concerned with KKT points, what really matters is the geometry of $K$ and not so much its representation. (To appear in Optimization Letters.)
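
    For reference, the KKT conditions in question for a minimizer $x^*$ of this problem read (standard form, recalled for convenience):

    ```latex
    \[
      \exists\, \lambda_1, \dots, \lambda_m \ge 0 : \quad
      \nabla f(x^*) + \sum_{j=1}^{m} \lambda_j \nabla g_j(x^*) = 0,
      \qquad \lambda_j\, g_j(x^*) = 0, \quad g_j(x^*) \le 0, \quad j = 1, \dots, m.
    \]
    ```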

    Local distinguishability of quantum states in infinite dimensional systems

    Full text link
    We investigate the local distinguishability of quantum states using convex analysis of the joint numerical range of operators on a Hilbert space. We show that any two orthogonal pure states are distinguishable by local operations and classical communication, even for infinite-dimensional systems. An estimate of the local discrimination probability is also given for some families of more than two pure states.
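
    For context, the joint numerical range used here is the standard object (recalled for convenience, not specific to this paper): for bounded operators $A_1, \dots, A_m$ on a Hilbert space $\mathcal{H}$,

    ```latex
    \[
      W(A_1, \dots, A_m) \;=\;
      \{\, (\langle \psi, A_1 \psi \rangle, \dots, \langle \psi, A_m \psi \rangle)
           : \psi \in \mathcal{H},\ \|\psi\| = 1 \,\}.
    \]
    ```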

    New approximations for the cone of copositive matrices and its dual

    Full text link
    We provide convergent hierarchies for the cone C of copositive matrices and its dual, the cone of completely positive matrices. In both cases the corresponding hierarchy consists of nested spectrahedra and provides outer (resp. inner) approximations for C (resp. for its dual), thus complementing previous inner (resp. outer) approximations for C (for the dual). In particular, both the inner and outer approximations have a very simple interpretation. Finally, the extension to K-copositivity and K-complete positivity for a closed convex cone K is straightforward.
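
    For reference, the two cones in question admit the standard descriptions below (recalled for convenience):

    ```latex
    \[
      \mathcal{C} \;=\; \{\, A \in \mathcal{S}^n : x^{\top} A x \ge 0 \ \ \forall x \in \mathbb{R}^n_{+} \,\},
      \qquad
      \mathcal{C}^{*} \;=\; \operatorname{conv}\{\, x x^{\top} : x \in \mathbb{R}^n_{+} \,\}.
    \]
    ```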
    • 

    corecore